Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges

Authors

Abstract

Most machine learning algorithms are configured by a set of hyperparameters whose values must be carefully chosen and which often considerably impact performance. To avoid a time-consuming and irreproducible manual trial-and-error process for finding well-performing hyperparameter configurations, various automatic hyperparameter optimization (HPO) methods can be employed, for example, based on resampling error estimation for supervised learning. After introducing HPO from a general perspective, this paper reviews important HPO methods, from simple techniques such as grid or random search to more advanced methods like evolution strategies, Bayesian optimization, Hyperband, and racing. This work gives practical recommendations regarding important choices to be made when conducting HPO, including the HPO algorithms themselves, performance evaluation, and how to combine HPO with machine learning pipelines, runtime improvements, and parallelization. This article is categorized under: Algorithmic Development > Statistics; Technologies > Machine Learning; Technologies > Prediction
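As a concrete illustration of the simplest baseline the abstract mentions, the sketch below implements plain random search over a toy two-dimensional hyperparameter space. The objective function, the parameter names (`lr`, `depth`), and the search ranges are hypothetical stand-ins; in practice the objective would be a resampling-based error estimate (e.g., cross-validated loss) of the learner under a given configuration.

```python
import random

def objective(lr, depth):
    # Toy surrogate for validation error with a known optimum at
    # lr = 0.1, depth = 5; a real HPO run would instead train the
    # model and return a resampling-based performance estimate.
    return (lr - 0.1) ** 2 + 0.01 * (depth - 5) ** 2

def random_search(n_trials, seed=0):
    # Sample configurations independently from the search space and
    # keep the best one seen so far.
    rng = random.Random(seed)
    best_cfg, best_err = None, float("inf")
    for _ in range(n_trials):
        cfg = {"lr": rng.uniform(1e-4, 1.0), "depth": rng.randint(1, 10)}
        err = objective(**cfg)
        if err < best_err:
            best_cfg, best_err = cfg, err
    return best_cfg, best_err

best_cfg, best_err = random_search(200)
```

Because each trial is independent, this baseline parallelizes trivially, which is one reason the survey treats it as a reference point for the more sample-efficient methods (Bayesian optimization, Hyperband, racing).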


Similar resources

Open Source Best Practices

Open source software is developed differently from how corporations typically develop software. Research into how open source works has been growing steadily. One driver of such research is the desire to understand how commercial software development could benefit from open source best practices. Do some of these practices also work within corporations? If so, what are they, and how can we transfer them? This article...


Non-stochastic Best Arm Identification and Hyperparameter Optimization

Motivated by the task of hyperparameter optimization, we introduce the non-stochastic best-arm identification problem. Within the multi-armed bandit literature, the cumulative regret objective enjoys algorithms and analyses for both the non-stochastic and stochastic settings, while, to the best of our knowledge, the best-arm identification framework has only been considered in the stochastic settin...


Open Loop Hyperparameter Optimization and Determinantal Point Processes

We propose the use of k-determinantal point processes in hyperparameter optimization via random search. Compared to conventional approaches where hyperparameter settings are sampled independently, a k-DPP promotes diversity. We describe an approach that transforms hyperparameter search spaces for efficient use with a k-DPP. Our experiments show significant benefits over uniform random search in...


Licensing Challenges and Best Practices in China

For companies seeking to operate and expand in China, as in any market, numerous permits, approvals, and reviews are required before they can proceed. From selling products to creating new manufacturing facilities, these processes—often referred to generally as “administrative licensing”—are necessary steps to invest, expand and conduct commercial operations in China. While administrative licen...


Scale-free network optimization: foundations and algorithms

We investigate the fundamental principles that drive the development of scalable algorithms for network optimization. Despite the significant amount of work on parallel and decentralized algorithms in the optimization community, the methods that have been proposed typically rely on strict separability assumptions for objective function and constraints. Besides sparsity, these methods typically d...



Journal

Journal title: Wiley Interdisciplinary Reviews-Data Mining and Knowledge Discovery

Year: 2023

ISSN: 1942-4787, 1942-4795

DOI: https://doi.org/10.1002/widm.1484